EVALUATION OF SIMULTANEOUS IDENTITY, AGE AND GENDER RECOGNITION FOR CROWD FACE MONITORING

Authors

Abstract

Nowadays, facial recognition combined with age estimation and gender prediction is deeply involved in crowd monitoring, a task that is highly complex for humans to perform. This paper proposes a unified system, based on already available deep learning and machine learning models (i.e., FaceNet, ResNet, Support Vector Machine, AgeNet and GenderNet), that automatically and simultaneously performs person identification, age estimation and gender prediction. The system is then evaluated on a newly proposed multi-face, realistic and challenging test dataset. Current face recognition technology primarily focuses on static datasets of known identities and does not address novel identities. Our approach, in contrast, is designed for continuous monitoring: whenever novel identities are found during inference, the system saves those faces with an appropriate label for each unique identity and is updated periodically in order to correctly recognise them in future inference iterations. However, extracting features from the whole dataset every time new identities are detected is not an efficient solution. To address this issue, we propose an incremental feature extraction and training method which aims to reduce the computational load of feature extraction. When tested on the dataset, the system recognizes pre-trained identities, estimates age and predicts gender with average accuracies of 49%, 66.5% and 93.54%, respectively. We conclude that the system can be sensitive, and not fully robust, in an uncontrolled environment (e.g., under abrupt lighting conditions).
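The incremental feature extraction and training idea outlined in the abstract can be illustrated with a short sketch. The snippet below is a hypothetical illustration, not the authors' implementation: the embed callable stands in for any pre-trained face embedder (e.g., a FaceNet-style model), a linear SVM is used as the identity classifier as named in the abstract, and the IncrementalFaceGallery class and its method names are assumptions made for this example. The key point is that embeddings for previously enrolled identities are cached, so only newly detected faces are passed through the embedder before the classifier is retrained.

    # Minimal sketch of incremental feature extraction for identity enrolment.
    # Assumptions: `embed` is an external pre-trained face embedder; the class
    # and method names are hypothetical, not from the paper.
    from typing import Callable, List

    import numpy as np
    from sklearn.svm import SVC


    class IncrementalFaceGallery:
        """Caches face embeddings so only newly detected identities are re-embedded."""

        def __init__(self, embed: Callable[[np.ndarray], np.ndarray]):
            self.embed = embed                     # pre-trained embedder (assumed)
            self.features: List[np.ndarray] = []   # cached embeddings (feature bank)
            self.labels: List[str] = []            # identity label per embedding
            self.classifier = SVC(kernel="linear", probability=True)

        def add_identity(self, face_crops: List[np.ndarray], label: str) -> None:
            # Extract embeddings only for the new identity's face crops ...
            for crop in face_crops:
                self.features.append(self.embed(crop))
                self.labels.append(label)
            # ... then retrain the SVM on the cached feature bank, without
            # recomputing embeddings for identities enrolled earlier.
            if len(set(self.labels)) >= 2:         # SVC needs at least two classes
                self.classifier.fit(np.vstack(self.features), np.array(self.labels))

        def identify(self, face_crop: np.ndarray) -> str:
            # Predict the identity label of a single face crop.
            return self.classifier.predict(self.embed(face_crop).reshape(1, -1))[0]

In use, a gallery of this kind could be built up during inference by calling add_identity whenever an unknown face is assigned a new label, mirroring the periodic model update described in the abstract while avoiding repeated feature extraction over the whole dataset.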


Similar Articles

Face and Gender Classification in Crowd Video

Research in face and gender recognition in constrained environments has achieved an acceptable level of performance. There have been advancements in face and gender recognition in unconstrained environments; however, there is significant scope for improvement in the surveillance domain. Face and gender recognition in such a setting poses a set of challenges including unreliable face detection, mult...


Feature Extraction based Face Recognition, Gender and Age Classification

A face recognition system with large training sets for personal identification normally attains good accuracy. In this paper, we propose a Feature Extraction based Face Recognition, Gender and Age Classification (FEBFRGAC) algorithm that uses only small training sets and yields good results even with one image per person. This process involves three stages: Pre-processing, Feature Extrac...


Face, Age and Gender Recognition using Local Descriptors

This thesis focuses on the area of face processing and aims at designing a reliable framework to facilitate face, age, and gender recognition. A Bag-of-Words framework has been optimized for the task of face recognition by evaluating different feature descriptors and different bag-of-words configurations. More specifically, we choose a compact set of features (e.g., descriptors, window location...


Classification of Face Images for Gender, Age, Facial Expression, and Identity

In this paper we compare two models for extracting features from face images and several neural classifiers for their applicability to classify gender, age, facial expression, and identity. These models are i) a description of face images by their projection on independent base images and ii) an Active Appearance Model which describes the shape and grey value variations of the face images. The ...


Face Recognition and Gender Determination

The system presented here is a specialized version of a general object recognition system. Images of faces are represented as graphs, labeled with topographical information and local templates. Different poses are represented by different graphs. New graphs of faces are generated by an elastic graph matching procedure comparing the new face with a set of precomputed graphs: the "general face know...



Journal

Journal title: ASEAN Engineering Journal

Year: 2023

ISSN: 2586-9159

DOI: https://doi.org/10.11113/aej.v13.17612